Source Coding for Synthesizing Correlated Randomness
Authors
Abstract
We consider a scenario wherein two parties, Alice and Bob, are provided $X_{1}^{n}$ and $X_{2}^{n}$, samples drawn IID from the PMF $P_{X_{1} X_{2}}$. They can communicate to Charlie over (noiseless) communication links of rates $R_{1}$ and $R_{2}$, respectively. Their goal is to enable Charlie to generate $Y^{n}$ such that the triple $(X_{1}^{n}, X_{2}^{n}, Y^{n})$ has a distribution close, in total variation, to $\prod P_{X_{1} X_{2} Y}$, enabling the three parties to achieve strong coordination. In addition, the parties may possess pairwise shared common randomness at rates $C_{1}$ and $C_{2}$. We address the problem of characterizing the set of rate quadruples $(R_{1}, R_{2}, C_{1}, C_{2})$ for which the above can be accomplished. We propose a new coding scheme based on random algebraic codes, coset codes in particular, of asymptotically large block-length. We analyze its performance and derive a single-letter information-theoretic inner bound. This bound subsumes the largest known inner bound and improves upon it strictly for identified examples. Our findings build on a variant of soft-covering that generalizes its applicability to code ensembles. Finally, we provide an outer bound for this three-party distributed setup.
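Strong coordination requires the synthesized joint distribution to be close to the target in total variation. As a minimal illustrative sketch (the toy target PMF and the perturbation below are hypothetical, not from the paper), the total variation distance between two PMFs over a finite alphabet can be computed as follows:

```python
from itertools import product

def total_variation(p, q):
    """Total variation distance between two PMFs given as dicts over the same alphabet."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(s, 0.0) - q.get(s, 0.0)) for s in support)

# Hypothetical uniform target P_{X1 X2 Y} over binary alphabets (illustration only).
target = {(x1, x2, y): 1 / 8 for x1, x2, y in product([0, 1], repeat=3)}

# A slightly perturbed "synthesized" distribution.
synth = dict(target)
synth[(0, 0, 0)] += 0.05
synth[(1, 1, 1)] -= 0.05

print(round(total_variation(target, synth), 6))  # 0.05
```

In the paper's setting the distance is measured between the n-letter synthesized distribution and the n-fold product of the single-letter target; this sketch only shows the single-letter computation.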
Similar Resources
Distributed Joint Source-Channel Coding for arbitrary memoryless correlated sources and Source coding for Markov correlated sources using LDPC codes
In this paper, we give a distributed joint source channel coding scheme for arbitrary correlated sources for arbitrary point in the Slepian-Wolf rate region, and arbitrary link capacities using LDPC codes. We consider the Slepian-Wolf setting of two sources and one destination, with one of the sources derived from the other source by some correlation model known at the decoder. Distributed enco...
Distributed Source Coding of Correlated Gaussian Sources
We consider the distributed source coding system of L correlated Gaussian sources Y_l, l = 1, 2, ..., L, which are noisy observations of correlated Gaussian remote sources X_k, k = 1, 2, ..., K. We assume that Y^L = (Y_1, Y_2, ..., Y_L) is an observation of the source vector X = (X_1, X_2, ..., X_K), having the form Y^L = AX + N, where A is an L×K matrix and N = (N_1, N_2, ..., N_L) is a vector ...
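The linear observation model Y^L = AX + N described above is straightforward to simulate; a minimal sketch, assuming hypothetical dimensions L = 3, K = 2 and a noise scale not taken from the paper:

```python
import numpy as np

# Hypothetical dimensions (illustration only): L = 3 noisy observations of K = 2 remote sources.
rng = np.random.default_rng(0)
L, K, n = 3, 2, 10000

A = rng.standard_normal((L, K))          # L x K observation matrix
X = rng.standard_normal((K, n))          # n i.i.d. samples of the Gaussian remote source vector
N = 0.1 * rng.standard_normal((L, n))    # independent observation noise, independent of X
Y = A @ X + N                            # noisy observations Y^L = A X + N

print(Y.shape)  # (3, 10000)
```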
Source Coding Algorithms Using the Randomness of a Past Sequence
We propose source coding algorithms that use the randomness of a past sequence. The proposed algorithms solve the problems of multi-terminal source coding, rate-distortion source coding, and source coding with partial side information at the decoder. We analyze the encoding rate and the decoding error rate in terms of almost-sure convergence. key words: bits-back coding, lossy source coding, mu...
Distributed Source Coding for Correlated Memoryless Gaussian Sources
We consider a distributed source coding problem of L correlated Gaussian observations Y_i, i = 1, 2, ..., L. We assume that the random vector Y^L = t(Y_1, Y_2, ..., Y_L) is an observation of the Gaussian random vector X = t(X_1, X_2, ..., X_K), having the form Y^L = AX + N, where A is an L×K matrix and N = t(N_1, N_2, ..., N_L) is a vector of L independent Gaussian random variables also indepe...
Robust Lossy Source Coding for Correlated Fading Channels
Most of the conventional communication systems use channel interleaving as well as hard decision decoding in their designs, which lead to discarding channel memory and soft-decision information. This simplification is usually done since the complexity of handling the memory or soft-decision information is rather high. In this work, we design two lossy joint source-channel coding (JSCC) schemes ...
Journal
Title: IEEE Transactions on Information Theory
Year: 2023
ISSN: 0018-9448, 1557-9654
DOI: https://doi.org/10.1109/tit.2022.3196186